Flying Robotics Art: ROS-based Drone Draws the Record-Breaking Mural

Korigodskii, Andrei A., Kalachev, Oleg D., Vasiunik, Artem E., Urvantsev, Matvei V., Bondar, Georgii E.

arXiv.org Artificial Intelligence

This paper presents the innovative design and successful deployment of a pioneering autonomous unmanned aerial system developed for executing the world's largest mural painted by a drone. Addressing the dual challenges of maintaining artistic precision and operational reliability under adverse outdoor conditions such as wind and direct sunlight, our work introduces a robust system capable of navigating and painting outdoors with unprecedented accuracy. Key to our approach is a novel navigation system that combines an infrared (IR) motion capture camera and LiDAR technology, enabling precise location tracking tailored specifically for large-scale artistic applications. We employ a unique control architecture that applies separate regulation in the tangential and normal directions relative to the planned path, enabling precise trajectory tracking and stable line rendering. We also present algorithms for trajectory planning and path optimization, allowing for complex curve drawing and area filling. The system includes a custom-designed paint spraying mechanism, specifically engineered to function effectively amidst the turbulent airflow generated by the drone's propellers, which also protects the drone's critical components from paint-related damage, ensuring longevity and consistent performance. Experimental results demonstrate the system's robustness and precision in varied conditions, showcasing its potential for autonomous large-scale art creation and expanding the functional applications of robotics in creative fields.
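
The abstract does not give the controller's equations, but the core idea of regulating the tangential and normal directions separately can be sketched in 2-D. The gains `v_nom` and `k_n` below are hypothetical, chosen only for illustration:

```python
import math

def path_errors(pos, a, b):
    """Decompose a position relative to segment a->b into an
    along-track (tangential) coordinate and a cross-track (normal) error."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(dx, dy)
    tx, ty = dx / length, dy / length   # unit tangent
    nx, ny = -ty, tx                    # unit normal (left of path)
    ex, ey = pos[0] - a[0], pos[1] - a[1]
    along = ex * tx + ey * ty           # progress along the segment
    cross = ex * nx + ey * ny           # perpendicular deviation
    return along, cross

def control(pos, a, b, v_nom=1.0, k_n=2.0):
    """Command a nominal painting speed along the path while a separate,
    stiffer proportional term drives the normal error to zero."""
    _, cross = path_errors(pos, a, b)
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(dx, dy)
    tx, ty = dx / length, dy / length
    nx, ny = -ty, tx
    vx = v_nom * tx - k_n * cross * nx
    vy = v_nom * ty - k_n * cross * ny
    return vx, vy
```

Decoupling the two axes lets the normal loop be tuned stiffly for line quality without disturbing the steady tangential speed that governs paint deposition.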


A Modular and Scalable System Architecture for Heterogeneous UAV Swarms Using ROS 2 and PX4-Autopilot

Pommeranz, Robert, Tebbe, Kevin, Heynicke, Ralf, Scholl, Gerd

arXiv.org Artificial Intelligence

In this paper, a modular and scalable architecture for heterogeneous swarm-based Counter Unmanned Aerial Systems (C-UASs), built on the PX4-Autopilot and Robot Operating System 2 (ROS 2) frameworks, is presented. The proposed architecture emphasizes seamless integration of hardware components by introducing independent ROS 2 nodes for each component of an Unmanned Aerial Vehicle (UAV). Communication between swarm participants is abstracted in software, allowing the use of various technologies without architectural changes. Key functionalities such as leader following and formation flight are supported to maneuver the swarm. The system also allows computer vision algorithms to be integrated for the detection and tracking of UAVs. Additionally, ground station control is integrated for the coordination of swarm operations. The swarm-based Unmanned Aerial System (UAS) architecture is verified both in a Gazebo simulation environment and in real-world demonstrations.
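
The paper's leader-following implementation is not detailed in the abstract; a minimal sketch of the usual geometric core, assuming each follower holds a body-frame offset behind the leader, might look like:

```python
import math

def follower_setpoint(leader_pos, leader_yaw, offset):
    """Rotate a body-frame offset (x forward, y left of the leader)
    into the world frame and add it to the leader's position to get
    the follower's position setpoint."""
    c, s = math.cos(leader_yaw), math.sin(leader_yaw)
    ox, oy = offset
    return (leader_pos[0] + c * ox - s * oy,
            leader_pos[1] + s * ox + c * oy)
```

In a ROS 2 node this computation would typically run in the callback of the leader's pose subscription, with the result published as the follower's position setpoint.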


ROSflight 2.0: Lean ROS 2-Based Autopilot for Unmanned Aerial Vehicles

Moore, Jacob, Tokumaru, Phil, Reid, Ian, Sutherland, Brandon, Ritchie, Joseph, Snow, Gabe, McLain, Tim

arXiv.org Artificial Intelligence

ROSflight is a lean, open-source autopilot ecosystem for unmanned aerial vehicles (UAVs). Designed by researchers for researchers, it is built to lower the barrier to entry to UAV research and to accelerate the transition from simulation to hardware experiments by maintaining a lean (not full-featured), well-documented, and modular codebase. This publication builds on previous treatments and describes significant additions to the architecture that improve the modularity and usability of ROSflight, including the transition from ROS 1 to ROS 2, supported hardware, low-level actuator mixing, and the simulation environment. We believe that these changes improve the usability of ROSflight and enable it to accelerate research in areas like advanced air mobility. Hardware results are provided, showing that ROSflight is able to control a multirotor over a serial connection at 400 Hz while closing all control loops on the companion computer. In recent years, interest in unmanned aerial vehicles (UAVs) has increased significantly. Technological advances have enabled numerous applications of UAVs, including package delivery, photography, search-and-rescue, firefighting, as well as military applications. Advanced air mobility (AAM), a category broadly referring to increasing autonomy in urban areas for civilian use, is also currently an area of high interest.


Design and Implementation of a Dual Uncrewed Surface Vessel Platform for Bathymetry Research under High-flow Conditions

Kumar, Dinesh, Ghorbanpour, Amin, Yen, Kin, Soltani, Iman

arXiv.org Artificial Intelligence

Bathymetry, the study of underwater topography, relies on sonar mapping of submerged structures. These measurements, critical for infrastructure health monitoring, often require expensive instrumentation. The high financial risk associated with sensor damage or vessel loss creates a reluctance to deploy uncrewed surface vessels (USVs) for bathymetry. However, crewed-boat bathymetry operations are costly, pose hazards to personnel, and frequently fail to achieve the stable conditions necessary for bathymetry data collection, especially under high currents. Further research is essential to advance autonomous control, navigation, and data processing technologies, with a particular focus on bathymetry. There is a notable lack of accessible hardware platforms that allow for integrated research in both bathymetry-focused autonomous control and navigation, as well as data evaluation and processing. This paper addresses this gap through the design and implementation of two complementary USV systems tailored for uncrewed bathymetry research. This includes a low-cost USV for Navigation And Control research (NAC-USV) and a second, high-end USV equipped with a high-resolution multi-beam sonar and the associated hardware for Bathymetry data quality Evaluation and Post-processing research (BEP-USV). The NAC-USV facilitates the investigation of autonomous, fail-safe navigation and control, emphasizing the stability requirements for high-quality bathymetry data collection while minimizing the risk to equipment. The BEP-USV, which mirrors the NAC-USV hardware, is then used for additional control validation and in-depth exploration of bathymetry data evaluation and post-processing methodologies. We detail the design and implementation of both systems and open-source the design. Furthermore, we demonstrate the systems' effectiveness in a range of operational scenarios.


CloudTrack: Scalable UAV Tracking with Cloud Semantics

Blei, Yannik, Krawez, Michael, Nilavadi, Nisarga, Kaiser, Tanja Katharina, Burgard, Wolfram

arXiv.org Artificial Intelligence

Nowadays, unmanned aerial vehicles (UAVs) are commonly used in search and rescue scenarios to gather information in the search area. The automatic identification of the person searched for in aerial footage could increase the autonomy of such systems, reduce the search time, and thus increase the missing person's chances of survival. In this paper, we present a novel approach to perform semantically conditioned open vocabulary object tracking that is specifically designed to cope with the limitations of UAV hardware. Our approach has several advantages: it can operate from a verbal description of the missing person, e.g., the color of their shirt; it does not require dedicated training to execute the mission; and it can efficiently track a potentially moving person. Our experimental results demonstrate the versatility and efficacy of our approach.
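
The abstract does not specify the tracking backend. Once an open-vocabulary detector has flagged candidate boxes, one common and lightweight way to follow the person across frames is greedy intersection-over-union (IoU) association; a sketch of that step, with a hypothetical `min_iou` threshold, might look like:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def track_step(prev_box, detections, min_iou=0.3):
    """Follow the detection that best overlaps the previous box; return
    None when the person is lost and a fresh re-detection is needed."""
    best = max(detections, key=lambda d: iou(prev_box, d), default=None)
    if best is None or iou(prev_box, best) < min_iou:
        return None
    return best
```

Because the expensive semantic check only needs to run on re-detection, this kind of association keeps the per-frame cost low enough for onboard UAV hardware.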


Attention Meets UAVs: A Comprehensive Evaluation of DDoS Detection in Low-Cost UAVs

Sharma, Ashish, Vaddhiparthy, SVSLN Surya Suhas, Goparaju, Sai Usha, Gangadharan, Deepak, Kandath, Harikumar

arXiv.org Artificial Intelligence

This paper explores the critical issue of enhancing cybersecurity measures for low-cost, Wi-Fi-based Unmanned Aerial Vehicles (UAVs) against Distributed Denial of Service (DDoS) attacks. In the current work, we have explored three variants of DDoS attacks, namely Transmission Control Protocol (TCP), Internet Control Message Protocol (ICMP), and TCP + ICMP flooding attacks, and developed a detection mechanism that runs on the companion computer of the UAV system. As part of the detection mechanism, we have evaluated various machine learning and deep learning algorithms, such as XGBoost, Isolation Forest, Long Short-Term Memory (LSTM), Bidirectional-LSTM (Bi-LSTM), LSTM with attention, Bi-LSTM with attention, and Time Series Transformer (TST), in terms of various classification metrics. Our evaluation reveals that algorithms with attention mechanisms outperform their counterparts in general, and TST stands out as the most efficient model with a run time of 0.1 seconds. TST has demonstrated an F1 score of 0.999, 0.997, and 0.943 for TCP, ICMP, and TCP + ICMP flooding attacks respectively. In this work, we present the necessary steps required to build an on-board DDoS detection mechanism. Further, we present an ablation study to identify the best TST hyperparameters for DDoS detection, and we underscore the advantage of adopting learnable positional embeddings in TST for DDoS detection, with an improvement in F1 score from 0.94 to 0.99.
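
Whatever classifier runs on the companion computer, a flooding detector first needs windowed traffic features. A minimal stand-in for that feature-extraction stage, with an illustrative (not the paper's) window size and threshold, could look like:

```python
from collections import deque

class RateMonitor:
    """Sliding-window packet-rate monitor: a toy front end for the kind
    of on-board flood detection the paper builds with learned models."""

    def __init__(self, window=1.0, threshold=100):
        self.window = window        # window length in seconds
        self.threshold = threshold  # packets per window deemed suspicious
        self.times = deque()        # arrival times inside the window

    def observe(self, t):
        """Record a packet arrival at time t and return True if the
        current window's packet count crosses the flood threshold."""
        self.times.append(t)
        while self.times and self.times[0] <= t - self.window:
            self.times.popleft()
        return len(self.times) > self.threshold
```

In practice such per-window counts (and per-protocol variants for TCP and ICMP) would form the input sequence fed to the LSTM or TST classifier rather than being thresholded directly.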


Race Against the Machine: a Fully-annotated, Open-design Dataset of Autonomous and Piloted High-speed Flight

Bosello, Michael, Aguiari, Davide, Keuter, Yvo, Pallotta, Enrico, Kiade, Sara, Caminati, Gyordan, Pinzarrone, Flavio, Halepota, Junaid, Panerati, Jacopo, Pau, Giovanni

arXiv.org Artificial Intelligence

Unmanned aerial vehicles, and multi-rotors in particular, can now perform dexterous tasks in impervious environments, from infrastructure monitoring to emergency deliveries. Autonomous drone racing has emerged as an ideal benchmark to develop and evaluate these capabilities. Its challenges include accurate and robust visual-inertial odometry during aggressive maneuvers, complex aerodynamics, and constrained computational resources. As researchers increasingly channel their efforts into it, they also need the tools to compare their results and advances in a timely and equitable manner. With this dataset, we want to (i) support the development of new methods and (ii) establish quantitative comparisons for approaches coming from the broader robotics, controls, and artificial intelligence communities. We want to provide a one-stop resource comprising (i) aggressive autonomous and piloted flight, (ii) high-resolution, high-frequency visual, inertial, and motion capture data, (iii) commands and control inputs, (iv) multiple light settings, and (v) corner-level labeling of drone racing gates. We also release the complete specifications to recreate our flight platform, using commercial off-the-shelf components and the open-source flight controller Betaflight. Our dataset, open-source scripts, and drone design are available at: https://github.com/tii-racing/drone-racing-dataset.


Reducing Object Detection Uncertainty from RGB and Thermal Data for UAV Outdoor Surveillance

Sandino, Juan, Caccetta, Peter A., Sanderson, Conrad, Maire, Frederic, Gonzalez, Felipe

arXiv.org Artificial Intelligence

Recent advances in Unmanned Aerial Vehicles (UAVs) have resulted in their quick adoption for a wide range of civilian applications, including precision agriculture, biosecurity, disaster monitoring and surveillance. UAVs offer low-cost platforms with flexible hardware configurations, as well as an increasing number of autonomous capabilities, including take-off, landing, object tracking and obstacle avoidance. However, little attention has been paid to how UAVs deal with object detection uncertainties caused by false readings from vision-based detectors, data noise, vibrations, and occlusion. In most situations, the relevance and understanding of these detections are delegated to human operators, as many UAVs have limited cognition power to interact autonomously with the environment. This paper presents a framework for autonomous navigation under uncertainty in outdoor scenarios for small UAVs using a probabilistic-based motion planner. The framework is evaluated with real flight tests using a sub-2 kg quadrotor UAV and illustrated in a victim-finding Search and Rescue (SAR) case study in a forest/bushland. The navigation problem is modelled using a Partially Observable Markov Decision Process (POMDP), and solved in real time onboard the small UAV using Augmented Belief Trees (ABT) and the TAPIR toolkit. Results from experiments using colour and thermal imagery show that the proposed motion planner provides accurate victim localisation coordinates, as the UAV has the flexibility to interact with the environment and obtain clearer visualisations of any potential victims compared to the baseline motion planner. Incorporating this system allows optimised UAV surveillance operations by diminishing false positive readings from vision-based object detectors.
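
At the heart of any POMDP formulation like this is a belief over the victim's location that is updated with each (possibly false-positive) detection. A minimal discrete Bayes update over grid cells, with hypothetical detector rates `p_detect` and `p_false`, can illustrate the mechanism:

```python
def belief_update(belief, observed_cell, p_detect=0.9, p_false=0.1,
                  detection=True):
    """One Bayes observation update for a victim-location belief over
    grid cells: a detection upweights the sensed cell, a non-detection
    downweights it, and the result is renormalized."""
    posterior = []
    for i, prior in enumerate(belief):
        if detection:
            like = p_detect if i == observed_cell else p_false
        else:
            like = (1 - p_detect) if i == observed_cell else (1 - p_false)
        posterior.append(like * prior)
    z = sum(posterior)
    return [p / z for p in posterior]
```

Solvers such as ABT approximate this belief with particles and plan over it, which is what lets the UAV fly closer to a candidate cell to confirm or reject a noisy detection instead of acting on it immediately.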


How ornithopters can perch autonomously on a branch

Zufferey, Raphael, Barbero, Jesus Tormo, Talegon, Daniel Feliu, Nekoo, Saeed Rafee, Acosta, Jose Angel, Ollero, Anibal

arXiv.org Artificial Intelligence

Flapping wings are a bio-inspired method to produce lift and thrust in aerial robots, leading to quiet and efficient motion. The advantages of this technology include safety, maneuverability, and physical interaction with the environment, humans, and animals. However, to enable substantial applications, these robots must perch and land. Despite recent progress in the perching field, flapping-wing vehicles, or ornithopters, are to this day unable to stop their flight on a branch. In this paper, we present a novel method that defines a process to reliably and autonomously land an ornithopter on a branch. This method describes the joint operation of a flapping-flight controller, a close-range correction system, and a passive claw appendage. Flight is handled by a triple pitch-yaw-altitude controller and integrated body electronics, permitting perching at 3 m/s. The close-range correction system, with fast optical branch sensing, compensates for position misalignment when landing. This is complemented by a passive bistable claw design that can lock and hold 2 Nm of torque, grasp within 25 ms, and re-open thanks to integrated tendon actuation. The perching method is supplemented by a four-step experimental development process which optimizes for a successful design. We validate this method with a 700 g ornithopter and demonstrate the first autonomous perching flight of a flapping-wing robot on a branch, a result replicated with a second robot. This work paves the way towards the application of flapping-wing robots for long-range missions, bird observation, manipulation, and outdoor flight.